Bootstrap Asymptotics

Author

  • Rudolf Beran
Abstract

The bootstrap, introduced by Efron (1979), merges simulation with formal model-based statistical inference. A statistical model for a sample X_n of size n is a family of distributions {P_{θ,n} : θ ∈ Θ}. The parameter space Θ is typically metric, possibly infinite-dimensional. The value of θ that identifies the true distribution from which X_n is drawn is unknown. Suppose that θ̂_n = θ̂_n(X_n) is a consistent estimator of θ. The bootstrap idea is:

(a) Create an artificial bootstrap world in which the true parameter value is θ̂_n and the sample X*_n is generated from the fitted model P_{θ̂_n,n}. That is, the conditional distribution of X*_n, given the data X_n, is P_{θ̂_n,n}.

(b) Act as if a sampling distribution computed in the fully known bootstrap world is a trustworthy approximation to the corresponding, but unknown, sampling distribution in the model world.

For example, consider constructing a confidence set for a parametric function τ(θ), whose range is the set T. As in the classical pivotal method, let R_n(X_n, τ(θ)) be a specified root, a real-valued function of the sample and τ(θ). Let H_n(θ) be the sampling distribution of the root under the model. The bootstrap distribution of the root is H_n(θ̂_n), a random probability measure that can also be viewed as the conditional distribution of R_n(X*_n, τ(θ̂_n)) given the sample X_n. An associated bootstrap confidence set for τ(θ), of nominal coverage probability β, is then C_{n,B} = {t ∈ T : R_n(X_n, t) ≤ H_n^{-1}(β, θ̂_n)}. The quantile on the right can be approximated, for instance, by Monte Carlo techniques. The intuitive expectation is that the coverage probability of C_{n,B} will be close to β whenever θ̂_n is close to θ.

When does the bootstrap approach work? Bootstrap samples are perturbations of the data from which they are generated. If the goal is to probe how a statistical procedure performs on data sets similar to the one at hand, then repeating the statistical procedure on bootstrap samples stands to be instructive. An exploratory rationale for the bootstrap appeals intellectually when empirically supported probability models for the data are lacking. Indeed, the literature on "statistical inference" continues to struggle with an uncritical tendency to view data as a random sample from a statistical model known to the statistician apart ...
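The confidence-set construction above is straightforward to simulate. The sketch below is not from the paper: the Gaussian location model, the root √n·|mean(X_n) − t|, and all function names are assumptions chosen for illustration. It draws bootstrap samples from the fitted model P_{θ̂_n,n}, approximates the quantile H_n^{-1}(β, θ̂_n) by Monte Carlo, and returns the resulting interval C_{n,B}.

```python
import numpy as np

def bootstrap_ci_mean(x, beta=0.95, n_boot=2000, rng=None):
    """Parametric bootstrap confidence interval for a normal mean.

    Root: R_n(X_n, t) = sqrt(n) * |mean(X_n) - t|.
    Fitted model P_{theta_hat,n}: i.i.d. N(mean(x), var(x)).
    (Illustrative sketch; model choice and names are assumptions.)
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x, dtype=float)
    n = x.size
    mu_hat, sigma_hat = x.mean(), x.std(ddof=1)

    # (a) Draw bootstrap samples X*_n from the fitted model and
    # evaluate the root at tau(theta_hat) = mu_hat.
    boot = rng.normal(mu_hat, sigma_hat, size=(n_boot, n))
    roots = np.sqrt(n) * np.abs(boot.mean(axis=1) - mu_hat)

    # (b) Use the Monte Carlo quantile of the bootstrap roots in place
    # of the unknown quantile H_n^{-1}(beta, theta).
    q = np.quantile(roots, beta)

    # C_{n,B} = {t : sqrt(n) * |mean(x) - t| <= q}
    half_width = q / np.sqrt(n)
    return mu_hat - half_width, mu_hat + half_width

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=1.0, scale=2.0, size=50)
    print(bootstrap_ci_mean(data, beta=0.95, rng=1))
```

With a few thousand bootstrap replicates, the Monte Carlo error in the quantile is typically negligible relative to the bootstrap approximation error itself.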


Similar articles

Standard errors in covariance structure models: asymptotics versus bootstrap.

Commonly used formulae for standard error (SE) estimates in covariance structure analysis are derived under the assumption of a correctly specified model. In practice, a model is at best only an approximation to the real world. It is important to know whether the estimates of SEs as provided by standard software are consistent when a model is misspecified, and to understand why if not. Bootstra...
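As a generic illustration of the bootstrap alternative to formula-based standard errors, the following nonparametric resampling sketch may help; it is not the covariance-structure machinery the paper analyses, and the function names and toy data are assumptions.

```python
import numpy as np

def bootstrap_se(x, estimator, n_boot=1000, rng=None):
    """Nonparametric bootstrap standard error of estimator(x).

    Resamples observations with replacement and returns the standard
    deviation of the estimator over the bootstrap replicates.
    (Illustrative sketch; not the paper's procedure.)
    """
    rng = np.random.default_rng(rng)
    x = np.asarray(x)
    n = x.shape[0]
    reps = np.array([
        estimator(x[rng.integers(0, n, size=n)]) for _ in range(n_boot)
    ])
    return reps.std(ddof=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.standard_t(df=3, size=200)   # heavy-tailed toy data
    se_boot = bootstrap_se(sample, np.mean, rng=1)
    se_formula = sample.std(ddof=1) / np.sqrt(sample.size)  # textbook formula
    print(se_boot, se_formula)
```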


Change-point in stochastic design regression and the bootstrap

In this paper we study the consistency of different bootstrap procedures for constructing confidence intervals (CIs) for the unique jump discontinuity (change-point) in an otherwise smooth regression function in a stochastic design setting. This problem exhibits nonstandard asymptotics and we argue that the standard bootstrap procedures in regression fail to provide valid confidence intervals f...


Bootstrap percolation on the random graph G_{n,p}

Bootstrap percolation on the random graph G_{n,p} is a process of spread of “activation” on a given realization of the graph with a given number of initially active nodes. At each step those vertices which have not been active but have at least r ≥ 2 active neighbours become active as well. We study the size A* of the final active set. The parameters of the model are, besides r (fixed) and n (tend...
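A minimal simulation of the activation process described above may help fix ideas; the function name, parameter choices, and the adjacency-matrix representation are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def bootstrap_percolation(n, p, r, n_initial, rng=None):
    """Size of the final active set in bootstrap percolation on G(n, p).

    At each step, every inactive vertex with at least r active
    neighbours becomes active; iterate until nothing changes.
    (Illustrative sketch; not the paper's notation or analysis.)
    """
    rng = np.random.default_rng(rng)
    # One realization of G(n, p) as a symmetric boolean adjacency matrix.
    upper = np.triu(rng.random((n, n)) < p, k=1)
    adj = upper | upper.T

    active = np.zeros(n, dtype=bool)
    active[rng.choice(n, size=n_initial, replace=False)] = True

    while True:
        counts = adj[:, active].sum(axis=1)     # active neighbours per vertex
        new = (~active) & (counts >= r)
        if not new.any():
            break
        active |= new
    return int(active.sum())

if __name__ == "__main__":
    print(bootstrap_percolation(n=1000, p=0.01, r=2, n_initial=50, rng=0))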


HIGHER-ORDER IMPROVEMENTS OF A COMPUTATIONALLY ATTRACTIVE k-STEP BOOTSTRAP FOR EXTREMUM ESTIMATORS

This paper establishes the higher-order equivalence of the k-step bootstrap, introduced recently by Davidson and MacKinnon (1999), and the standard bootstrap. The k-step bootstrap is a computationally very attractive alternative to the standard bootstrap for statistics based on nonlinear extremum estimators, such as generalized method of moments and maximum likelihood estimators. The paper also ...



A Bootstrap Theory for Weakly Integrated Processes

This paper develops a bootstrap theory for models including autoregressive time series with roots approaching unity as the sample size increases. In particular, we consider processes with roots converging to unity at rates slower than n⁻¹. We call such processes weakly integrated processes. It is established that the bootstrap relying on the estimated autoregressive model is generally ...
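For orientation only, a residual-based bootstrap built on an estimated AR(1) model, the kind of model-based resampling referred to above, can be sketched as follows; the least-squares fit, the handling of the initial value, and all names are assumptions made for illustration, not the paper's construction.

```python
import numpy as np

def ar1_bootstrap_replicate(y, rng=None):
    """One bootstrap replicate of the AR(1) coefficient estimate.

    Fits y_t = rho * y_{t-1} + e_t by least squares, resamples the
    centred residuals with replacement, rebuilds a series from the
    fitted model, and re-estimates rho on the bootstrap series.
    (Illustrative sketch; not the paper's procedure.)
    """
    rng = np.random.default_rng(rng)
    y = np.asarray(y, dtype=float)
    y_lag, y_cur = y[:-1], y[1:]
    rho_hat = (y_lag @ y_cur) / (y_lag @ y_lag)
    resid = y_cur - rho_hat * y_lag
    resid -= resid.mean()

    e_star = rng.choice(resid, size=y.size - 1, replace=True)
    y_star = np.empty_like(y)
    y_star[0] = y[0]                     # keep the observed initial value
    for t in range(1, y.size):
        y_star[t] = rho_hat * y_star[t - 1] + e_star[t - 1]

    yl, yc = y_star[:-1], y_star[1:]
    return (yl @ yc) / (yl @ yl)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = np.empty(400)
    y[0] = 0.0
    for t in range(1, 400):              # a near-unit-root toy series
        y[t] = 0.98 * y[t - 1] + rng.standard_normal()
    reps = [ar1_bootstrap_replicate(y, rng=k) for k in range(200)]
    print(np.quantile(reps, [0.05, 0.95]))
```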




Publication date: 2011